Sparse Non-negative Matrix Factorization with Generalized Kullback-Leibler Divergence
Authors
Abstract
Non-negative Matrix Factorization (NMF), especially with sparseness constraints, plays an important role in data engineering and machine learning. Hoyer (2004) presented an algorithm to compute NMF under exact sparseness constraints, which depend on a projection operator. In the present work, we first give a very simple counterexample on which the projection operator of the Hoyer (2004) algorithm fails. After analysing the reason geometrically, we fix this bug by adding random terms and show that the repaired operator works correctly. Based on the fixed projection operator, we propose another sparse NMF algorithm that optimizes the generalized Kullback-Leibler divergence, hence named SNMF-GKLD. Experimental results show that SNMF-GKLD not only behaves similarly to Hoyer (2004) on the same data sets, but is also efficient.
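As background for the abstract above, Hoyer (2004) quantifies sparseness through the ratio of a vector's L1 and L2 norms, and the generalized Kullback-Leibler divergence is the cost that SNMF-GKLD optimizes. The following Python sketch implements both quantities under their standard definitions; the function names are illustrative, not taken from the paper.

import numpy as np

def hoyer_sparseness(x):
    # Hoyer (2004): sparseness(x) = (sqrt(n) - ||x||_1/||x||_2) / (sqrt(n) - 1).
    # Equals 1 for a vector with a single non-zero entry, 0 for a constant vector.
    n = x.size
    return (np.sqrt(n) - np.abs(x).sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1)

def gkl_divergence(V, WH, eps=1e-12):
    # Generalized KL divergence: sum_ij [ V*log(V/WH) - V + WH ],
    # with a small eps to guard against log(0) and division by zero.
    V, WH = np.maximum(V, eps), np.maximum(WH, eps)
    return float(np.sum(V * np.log(V / WH) - V + WH))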
Similar resources
Non-negative matrix factorization for visual coding
This paper combines linear sparse coding and non-negative matrix factorization into sparse non-negative matrix factorization. In contrast to non-negative matrix factorization, the new model can learn much sparser representations by imposing sparseness constraints explicitly; in contrast to the closely related model non-negative sparse coding, the new model can learn parts-based representations via fully multi...
Non-negative matrix factorization with fixed row and column sums
In this short note, we focus on the use of the generalized Kullback-Leibler (KL) divergence in the problem of non-negative matrix factorization (NMF). We show that when the generalized KL divergence is used as the cost function for NMF, the row sums and the column sums of the original matrix are preserved in the approximation. We use this special characteristic in several approximation prob...
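This preservation property is easy to check numerically. The sketch below runs the standard Lee-Seung multiplicative updates for the generalized KL cost (a common choice, not necessarily this note's exact scheme): each H update matches the column sums of WH to those of V exactly, each W update does the same for the row sums, and at a stationary point both hold simultaneously, so after enough iterations the printed residuals are tiny.

import numpy as np

rng = np.random.default_rng(0)
V = rng.random((20, 30))
W = rng.random((20, 5)) + 0.1
H = rng.random((5, 30)) + 0.1

for _ in range(2000):
    # Multiplicative updates for the generalized KL divergence.
    H *= (W.T @ (V / (W @ H + 1e-12))) / W.sum(axis=0)[:, None]
    W *= ((V / (W @ H + 1e-12)) @ H.T) / H.sum(axis=1)[None, :]

WH = W @ H
print(abs(WH.sum(axis=0) - V.sum(axis=0)).max())  # column sums of WH vs. V
print(abs(WH.sum(axis=1) - V.sum(axis=1)).max())  # row sums of WH vs. V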
Projective Nonnegative Matrix Factorization with α-Divergence
A new matrix factorization algorithm which combines two recently proposed nonnegative learning techniques is presented. Our new algorithm, α-PNMF, inherits the advantages of Projective Nonnegative Matrix Factorization (PNMF) for learning a highly orthogonal factor matrix. When the Kullback-Leibler (KL) divergence is generalized to the α-divergence, it gives our method more flexibility in approximati...
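For context, a common convention for the α-divergence in the NMF literature (e.g., Cichocki et al.), given here as our reading rather than this paper's exact definition, is

D_\alpha(P \| Q) = \frac{1}{\alpha(\alpha - 1)} \sum_{ij} \left[ P_{ij}^{\alpha} Q_{ij}^{1-\alpha} - \alpha P_{ij} + (\alpha - 1) Q_{ij} \right],

which recovers the generalized KL divergence D(P || Q) in the limit α → 1 and the reverse divergence D(Q || P) as α → 0.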
Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets
Nonnegative Matrix Factorization (NMF) with Kullback-Leibler Divergence (NMF-KL) is one of the most significant NMF problems and is equivalent to Probabilistic Latent Semantic Indexing (PLSI), which has been successfully applied in many applications. For sparse count data, a Poisson distribution and KL divergence provide sparse models and sparse representation, which describe the random variation ...
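The Poisson connection mentioned here can be made explicit. If each entry is modelled as V_{ij} ~ Poisson((WH)_{ij}), the negative log-likelihood is, up to terms that do not depend on W and H, exactly the generalized KL divergence:

-\log p(V \mid WH) = \sum_{ij} \left[ (WH)_{ij} - V_{ij} \log (WH)_{ij} + \log V_{ij}! \right] = D_{\mathrm{GKL}}(V \| WH) + \mathrm{const},

so minimizing the generalized KL cost amounts to maximum-likelihood estimation under a Poisson model.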
Csiszár's Divergences for Non-negative Matrix Factorization: Family of New Algorithms
In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches that allow us to obtain generalized forms of multiplicative NMF algorithms and unify some existing algorithms. We also give the flexible and relaxed form of the N...